2023-09-11 15:18:27 · AIbase
ELYZA Releases Japanese LLM Based on Llama 2, with 7 Billion Parameters, Competing with GPT-3.5
ELYZA has released 'ELYZA-japanese-Llama-2-7b', a Japanese LLM based on Meta's Llama 2 with 7 billion parameters and performance comparable to GPT-3.5. The model underwent additional pre-training and ELYZA's own post-training, and achieved the highest scores in a 5-level human evaluation. Although it has not yet reached the level of closed LLMs, it is already on par with GPT-3.5. ELYZA has also successfully developed LLMs in other languages, including English.